
How to Track the Evolution of Search Engine Algorithms & Why It’s Important to Do So

The vast majority of search marketers operating in the organic space at least lay claim to “following the latest algorithms” at the search engines, and in 90% of the client pitches I’ve ever heard (or made, for that matter), the subject comes up at least once. However, I think this is still a topic about which there’s not a lot of true understanding, and for those new to the field, it’s probably the most daunting aspect of the work. So, to help ease some pain, I figured I’d address many of the most common questions about keeping up with the search engines’ ever-changing mathematical formulas that rank search results.


What is an Algorithm? How does it apply to the Search Results at Google, Yahoo! & MSN/Live?

An algorithm is just a complex equation (or set of equations) that, in the search engines’ case, performs a sorting task. Here’s an example of an exceptionally simple search engine algorithm:

Rank = Number of Terms * Number of Links to Page * Number of Trusted Links

In the example above, the engine ranks pages on the basis of three simple factors – the number of times the search term appears on the page, the number of links to that page, and the number of “trusted” links to the page. In reality, Google has said that their algorithm contains more than 200 individual elements used to determine rank (ranking factors). The ranking factors in search engine algorithms come in two primary varieties (and dozens of offshoots) – query-dependent factors and query-independent factors.
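
To make that sorting idea concrete, here’s a minimal sketch in Python of how the toy formula above might order a handful of pages. The page data and field names are invented for illustration; no real engine works this simply.

```python
# A minimal sketch of the toy formula above, not any real engine's code.
# The example pages and their numbers are invented for illustration.

def toy_rank(term_count: int, total_links: int, trusted_links: int) -> int:
    """Rank = Number of Terms * Number of Links to Page * Number of Trusted Links."""
    return term_count * total_links * trusted_links

pages = [
    {"url": "a.example.com", "terms": 4, "links": 120, "trusted": 3},
    {"url": "b.example.com", "terms": 9, "links": 40,  "trusted": 1},
    {"url": "c.example.com", "terms": 2, "links": 300, "trusted": 8},
]

# Sort descending by the toy score, just as an engine sorts by its algorithm.
ranked = sorted(
    pages,
    key=lambda p: toy_rank(p["terms"], p["links"], p["trusted"]),
    reverse=True,
)
for position, page in enumerate(ranked, start=1):
    print(position, page["url"])
```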

Query-dependent factors are part of the sorting mechanism that’s executed when your search is submitted to the engine. The search engines don’t know what you’re about to search for, so there are many variables they can’t pre-calculate and must instead compute on demand. These include identifying pages that contain the keywords you’ve searched for, calculating keyword-based relevance, and collecting any geographic or personalized data about you in order to serve a more targeted result. To help preserve resources, the search engines do cache an enormous number of their most popular search results at regular intervals, so as not to force these computations more than is necessary.

Query-independent factors are pieces of information a search engine knows about a given site or page before a query is ever executed. The most famous example is Google’s PageRank, which purports to measure the global popularity of a web document, based on the links that point to it. Other factors might include TrustRank (a trust-based link metric), domain association (the website a piece of content is hosted on), keyword frequency (or term weight) and freshness.
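
The split between these two families of factors is easy to sketch in code. Everything below – the factor names, the weights, and the way they’re blended – is my own assumption for illustration, not anything the engines have published:

```python
# Illustrative sketch of query-independent vs. query-dependent scoring.
# Factor names, weights, and the blending formula are all assumptions.

# Query-independent: known before any search runs, so it can be precomputed
# and stored alongside the index (think PageRank, TrustRank, freshness).
precomputed = {
    "example.com/page": {"link_popularity": 0.8, "trust": 0.6, "freshness": 0.4},
}

def query_dependent_score(query: str, page_text: str) -> float:
    """Computed on demand: how well the page's text matches this query."""
    words = query.lower().split()
    matches = sum(page_text.lower().count(w) for w in words)
    return matches / max(len(page_text.split()), 1)

def final_score(query: str, url: str, page_text: str) -> float:
    pre = precomputed[url]
    # Blend the precomputed factors; real engines combine 200+ of these.
    independent = (0.5 * pre["link_popularity"]
                   + 0.3 * pre["trust"]
                   + 0.2 * pre["freshness"])
    return independent * query_dependent_score(query, page_text)

print(final_score("google algorithm", "example.com/page",
                  "A page about the google algorithm and ranking"))
```

Caching, mentioned above, simply means storing the final sorted results for popular queries so the query-dependent half doesn’t have to run on every single search.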

Algorithms directly impact the search results by acting as the engines’ sorting mechanism. The reason you see SEOmoz’s blog post ranking below the Google technology page and above the AMS.org page in the screenshot below is that Google’s algorithm has sorted the results that way.

[Screenshot: search results for “Google Algorithm”]

Last year, I wrote a post taking a rough guess at the macro-factors that might make up Google’s algorithm; it may serve as a helpful example of how to think about them in a non-technical fashion.

Why do SEOs Need to Pay Attention to Search Algorithms?

Because that’s how the search engines rank documents in the results, of course!

Seriously, though: if you’re a professional SEO trying to garner more search traffic, a detailed understanding of the search engine algorithms and a thorough study of the factors that impact them are vital to your job performance. If a time machine whisked my 2002 self forward to 2008, the litany of tragic SEO mistakes I’d make would probably dwarf any value I brought to my 2002 campaigns. In the 6 years since I first learned about the practice of influencing the search results, the algorithms have changed to an enormous extent. Let’s take a quick look at some of the algorithmic evolution we’ve seen over that period:

  • Inherent Trust in Link Metrics
    In 2002, PageRank (yes, the little green toolbar kind) was still a kingmaker in the world of search engine rankings. With a heap of anchor text and a lot of green fairy dust, you could rank for virtually anything under the sun. When “trust” entered the picture, raw link juice mattered less and “trusted” link sources mattered more – today they’re a strong element of link evaluation.
  • Domain Trust Over The Importance of Individual Pages
    The search engines have all developed some formula for weighting a domain’s “strength,” and all content on that domain benefits from its host. In 2002, we saw very little of this phenomenon, and individual pages stood largely on their own, with little regard for their host domain.
  • Temporal Analysis of Link Growth
    It was in 2005 that Google’s patent application – Information Retrieval Based on Historical Data – first opened the eyes of SEOs everywhere to Google’s use of temporality in link evaluation to detect potential spam and manipulation. Some attribute the infamous Nov. 2003 “Florida” update, when so many affiliate and early SEO’d sites lost their rankings, to elements mentioned in this patent.
  • Spam Identification by Anchor Text Pattern Evaluation
    Believe it or not, there was a time when having 50,000 instances of exactly the same anchor text gave you nothing but good rankings. Today, the engines are far more likely to take a very suspicious look at anyone whose link profile stands out so unnaturally.
  • Sandboxing of New Websites
    I think the first elements of the Sandbox became noticeable in March of 2004, and launching a new site hasn’t been the same since. With a harsh clampdown on new domains targeting commercial keywords that didn’t quickly acquire a strong, trusted link profile, Google eliminated an enormous amount of spam from their index (and made it a pain in the butt to help new sites and brands with SEO).
  • Fixing Blog Comment Spam
    In 2003, when I had a client who wanted to rank for a particular e-commerce term, I talked to a friend in the UK, acquired 8,000 or so links over the next 3 weeks, and ranked #1 the next month. In this heyday of wondrously powerful blog links, the comment spammers would always say, “Hey, 10,000 bloggers can’t be wrong!” As with all things too-good-to-be-true, this tactic largely died as nofollow and intelligent algorithms found ways to detect which comment links to count.
  • Crackdown on Reciprocal Link Tactics
    Even as recently as 6 months ago, thousands of sites in the real estate field relied on a relatively simplistic reciprocal link exchange scheme. No more, though – hundreds of those sites never got their rankings back, and real estate SEO has a vastly different look than it did in 2007.

These are just a few of the many changes to the algorithms over the last 6 years, and only by paying attention and staying ahead of the curve can we hope to provide our clients and our own projects with the best consulting and strategic advice possible. Keeping up with algorithmic changes, particularly those that validate new techniques or invalidate old ones, is not just essential to good SEO; it’s the responsibility of anyone whose job is to market to the search engines.

How can we Research and Keep Up with the Latest Trends in Algorithmic Evolution?

There are a few good, simple tactics that enable nearly anyone to keep up with the algorithms of the major engines. They include:

#1 Maintaining several websites (or at least having access to campaign & search visitor data) provides some of the best information you can use to make informed decisions. By observing the trends in how the search engines rank and send traffic to different types of sites based on their marketing and content activities, you’ll be able to use intuitive reasoning to form hypotheses about where the engines are moving. From there, testing, tweaking and re-evaluating will give you the knowledge you seek.

#2 Reading the following excellent sources for information on a regular basis will give you a big leg up in the battle for algorithmic insight:

  • SEO By The Sea – Bill Slawski’s blog regularly examines patent applications and IR papers for clues as to where the search engines might go next.
  • SEO Book – No one has a better pulse on successful strategies for targeting search engines than Aaron Wall.
  • The Google Cache – Virante regularly performs high-quality search engine testing, and this is where you’ll find the data they leak publicly.

#3 Running tests using nonsense keywords and domains (and controlling for external links) also gives terrific A/B test data about which factors matter more or less to the engines’ algos. I’ve described this testing process in more detail here in the Beginner’s Guide.
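
For what it’s worth, here’s a rough sketch of how you might structure and log such a test. The keyword, URLs, variants, and file name are all hypothetical, and the rank values would be recorded by hand after checking the engine:

```python
# Hypothetical logging structure for a nonsense-keyword test. Each test page
# varies exactly one factor while domain age, links, and hosting are held
# constant; ranks are observed manually and recorded over time.

import csv
from datetime import date

observations = [
    {"date": date(2008, 5, 1), "page": "test-a.example.com",
     "variant": "keyword-in-title", "rank": 1},
    {"date": date(2008, 5, 1), "page": "test-b.example.com",
     "variant": "keyword-in-body-only", "rank": 3},
]

with open("nonsense_keyword_test.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "page", "variant", "rank"])
    writer.writeheader()
    for row in observations:
        writer.writerow({**row, "date": row["date"].isoformat()})
```

Over a few weeks of observations, consistent rank differences between variants start to suggest which factor the engine is weighting.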

How do we Apply the Knowledge Learned from Research to Real-World Campaigns?

The same way we apply any piece of knowledge that’s primarily theory – by testing and iterating. If you see strong evidence or hear from a trusted source that linking in content provides more SEO value than linking in div elements or top-level menu navigation, you might give this a try by taking a single section of your site and instituting Wikipedia-like interlinking on content pages. If, after a month, you can observe that the engines (or a single engine) have crawled all those pages and your traffic from that source rose more than normal, you might consider the effect “plausible” and try the same thing on other sections of the site.
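
A quick back-of-the-envelope comparison against an untouched control section makes that “rose more than normal” judgment more honest. The numbers below are placeholders; you’d pull the real figures from your analytics package:

```python
# Hedged sketch: compare search traffic lift in the test section against a
# control section that received no changes. All numbers are placeholders.

def pct_change(before: float, after: float) -> float:
    return (after - before) / before * 100

test_before, test_after = 4200, 5100        # monthly search visits, test section
control_before, control_after = 8800, 9100  # monthly search visits, control section

test_lift = pct_change(test_before, test_after)
control_lift = pct_change(control_before, control_after)

# If the test section outpaces the control by a healthy margin, the effect
# is "plausible" and worth trying on another section of the site.
print(f"test: {test_lift:+.1f}%, control: {control_lift:+.1f}%, "
      f"delta: {test_lift - control_lift:+.1f} points")
```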

Alternatively, you can test in the nonsense-word environments described above. This gives less realistic feedback, but doesn’t endanger anything on your sites, either 🙂

All in all, keeping up with the algorithms parallels any other optimization strategy – tax deductions, faster routes to work, better ways to chop onions, etc. Read, research, test and if you experience positive results, implement.


There’s plenty more to the practice of algorithmic research and evaluation, but we’ll save that for another post. In the meantime, I’d love to hear your thoughts on algo studies and the value you receive from them.
